Regulatory Perspectives

Although many regulators are receptive to the advanced site characterization information provided by the ISC approach and newer characterization tools, some regulators may not be comfortable with the departure from standard site characterization practices. For regulatory agency personnel who have been operating under what may now be considered outdated conceptual models for subsurface contamination, incorporating the newer views of contaminant behavior into ongoing cleanups is a clear challenge. A key area of potential conflict for regulators is the use of potentially less accurate but less expensive characterization and analytical techniques. In many circumstances a high-resolution, high-density data set composed of slightly less accurate data (such as colorimetric test kit data or vapor-phase sampling) can reveal more about a site than a much more limited amount of extremely high-quality data (such as Contract Laboratory Program [CLP] data); however, some activities, such as compliance and site closure, require more precise data.

This chapter discusses some of the potential regulatory acceptance issues associated with the advanced DNAPL site characterization methods—including the new tools and technologies used to develop a more detailed CSM; the types of analyses, decisions, and responses associated with the various types of data collection (which vary depending on site and project circumstances); and reconciling the advancements in site characterization with current regulatory expectations and requirements.

Regulatory Challenges

As discussed above, many of the regulatory challenges regarding advanced site characterization approaches and tools result from unfamiliarity with, and a lack of understanding of, the new methods and changing knowledge base. Some of the prominent issues are discussed below.

Lack of Familiarity and Understanding of Subsurface Dynamics

The advanced understanding of subsurface contaminant behavior (Chapter 3) has yet to be promulgated on a scale sufficient to allow most regulators, as well as investigation/remediation practitioners, to benefit from the new methods and technologies. The improved understanding of subsurface dynamics—provided in the Key Elements (Chapter 4) of Integrated DNAPL Site Characterization Strategies—should be made readily available to regulators. This web-based guidance and the ITRC’s free internet-based training (IBT) are a major step in providing the understanding necessary to put the ISC approach and the use of advanced characterization tools into broad practice.

Objectives-Based Characterization

Many regulators are accustomed to using qualitative, general site characterization objectives in developing CSMs. Chapter 4 describes the development of specific data collection objectives based on understanding a site’s uncertainty and the spatial resolution (scale) necessary to adequately develop and refine a CSM. Using this approach, characterization activities can be appropriately driven by the objectives, and the objectives can be made as clear, focused, and specific as possible. ITRC champions the use of SMART objectives for remediation of chlorinated solvent-contaminated sites, with or without DNAPL (ITRC 2011b). Although characterization objectives are not necessarily required to meet all of the SMART attributes, they should be as specific as possible (given what is and is not known about the site) so that a more accurate CSM can be developed. Objectives-based characterization can be challenging for regulators who are unaccustomed to the newer approach, but their concerns can be alleviated by learning how to develop site-specific objectives (ITRC 2011b).

Mass Discharge as a Regulatory Metric

An additional challenge for regulators involves the need to link metrics to each cleanup objective. Regulations rely on the use of concentration-based standards; however, concentration data alone may not provide a sound basis for defining the point at which a cleanup objective is attained. Thus, estimating mass discharge may provide more meaningful supportive data, even though the role of mass flux and mass discharge information in the regulatory decision framework is not clear or consistent. Because mass discharge as a regulatory metric is a fairly new concept (ITRC 2010), many regulators are unclear about how it might relate to concentration-based standards as well as how to use such mass discharge estimates in decision making.

Chapter 3 of ITRC (2010) and the associated IBT discuss the benefits and challenges of mass flux and mass discharge data; ITRC (2011b) also discusses the application of mass flux and mass discharge data to DNAPL remediation projects. For example, mass flux and mass discharge data can be used to measure the effectiveness of source remediation; however, defining the role of these values in remedial decision making and performance monitoring remains a challenge, at least in part because the regulatory benefits are not always clear and their role as a potential regulatory metric is not yet consistently accepted. Because of their potential usefulness, site data collection objectives and data gathering activities should treat mass flux and mass discharge data as a more central part of site investigations than they have been in the past.
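To make the concept more concrete, the sketch below illustrates how a mass discharge estimate is commonly assembled from transect data: each sampling point is treated as representing a subarea of the plume cross section, mass flux is the product of Darcy flux and concentration, and mass discharge is the sum of the flux contributions across the transect. The values, subarea assignments, and variable names are hypothetical and are intended only to show the arithmetic, not to prescribe a measurement method (see ITRC 2010 for measurement approaches).

```python
# Minimal sketch (hypothetical values): estimating mass discharge (Md) across a
# transect of multilevel sampling points.
# Mass flux:      J_i = q_i * C_i          [g/day per m^2]
# Mass discharge: Md  = sum(J_i * A_i)     [g/day]

# Hypothetical transect data:
# (darcy_flux [m/day], concentration [g/m^3 = mg/L], subarea [m^2])
points = [
    (0.05, 12.0, 4.0),
    (0.08, 45.0, 4.0),
    (0.03,  2.5, 4.0),
]

mass_discharge = sum(q * c * a for q, c, a in points)  # g/day
print(f"Estimated mass discharge: {mass_discharge:.1f} g/day")
```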

Use of Nontraditional Characterization Methods

Many regulators are accustomed to traditional investigative methods such as using groundwater monitoring wells to collect water samples and borings to collect soil and soil gas samples. In addition, traditional methods are sometimes incorporated into state regulations or reimbursement criteria for liability trust funds. As discussed above, improved understanding of subsurface dynamics requires newer, less traditional characterization methods to define site-specific dynamics; however, the resulting data sets from multiple characterization tools must be integrated with traditional data, such as groundwater monitoring data. Furthermore, as discussed in Chapter 4, monitoring wells are not recommended as primary characterization tools in unconsolidated aquifers because of vertical and volumetric averaging of contaminant concentrations in the boreholes. In many cases, apparent regulatory impediments to the use of newer characterization tools may be addressed through the use of collaborative data sets as well as discretionary use of alternative methods that would result in improved site characterization.

Chapter 4 also discusses another characterization method that has been increasingly applied over the past decade: the iterative investigative approach. The iterative approach is presented in the ITRC Triad document (ITRC 2003). Since that guidance was developed, the iterative approach (which involves systematic planning, real-time data, and dynamic work plans) has gained much wider acceptance and has been used extensively in emergency response. Use of the iterative approach during characterization of DNAPL sites can improve the cost-effectiveness of the investigation.

As the Triad guidance and associated IBT are further promulgated and iterative investigations become more commonplace, these methods will become more widely accepted. In particular, there is a need for more examples of iterative investigations and of successful site characterization and remediation using these newer methods.

A third area of nontraditional characterization is the use of potentially less accurate but also less expensive characterization and analytical techniques (many of which are described in this document). The use of such tools is an important component of ISC, allowing real-time or near real-time data to guide the site investigation. Another considerable advantage of using these newer methods is that, in many cases, a higher-resolution, higher-density data set composed of slightly less accurate data (such as colorimetric test kit data or vapor-phase analysis of groundwater samples) can reveal much more about a site than can a more limited amount of extremely high-quality data (such as CLP-type analytical data). As discussed in the next section, the results of these methods can be used collaboratively with more traditional laboratory data, and they do not preclude the use of traditional laboratory data for compliance/site closure activities.

Use of Collaborative Data Sets to Refine the CSM

Guidance documents historically have not discussed the use of collaborative data sets (Section 1.3) for developing CSMs, and the reliance on such incomplete guidance has not allowed for the development of effective site-specific CSMs. For example, some states may require the use of SW-846 in the collection and analysis of samples from investigations of contaminated sites; however, several of the characterization methods discussed earlier might not meet all of the requirements found in SW-846. Some data, although obtained in a manner not included in SW-846, could provide useful information for site characterization; thus, CSM development would be less efficient if such data were not accepted by regulators. Through integration of all data types, collaborative data sets can be generated and used to enhance the completeness of the CSM.

Data sets that might not be effective in project decision making when considered individually could, when considered together, manage all relevant relational, sampling, and analytical uncertainties to the degree necessary to support defensible decision making. Typically, less expensive analytical methods are used to generate an effective sampling density and real-time turnaround so that an accurate CSM can be constructed and sampling uncertainties managed. Any analytical uncertainty remaining in the data set is then managed by analyzing selected samples (whose representativeness is established via field-based methods) with more rigorous analytical methods to obtain lower quantification limits and analyte-specific results. Collaborative data sets often are not directly comparable and, if not, should not be mathematically combined. This may be considered a type of weight-of-evidence approach for CSM development. Collaborative data sets are also used to develop field-to-laboratory correlations and field-based decision criteria. This multiple-lines-of-evidence approach enables the CSM to provide a better picture of contaminant transport, storage, and attenuation.
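As a simple illustration of the field-to-laboratory correlation mentioned above, the sketch below fits a linear relationship between hypothetical co-located field screening and fixed-laboratory results and then converts a lab-based action level into an approximate field-based decision criterion. The data, action level, and variable names are hypothetical; an actual correlation would be developed and validated under the project’s data quality objectives.

```python
# Minimal sketch (hypothetical data): developing a field-to-laboratory correlation
# from a collaborative data set, then translating a lab-based action level into a
# field-based decision criterion for real-time screening.
import numpy as np

# Paired results for co-located samples (hypothetical):
# field screening (e.g., colorimetric kit) vs. fixed-laboratory analysis, in ug/L
field = np.array([15.0, 42.0, 88.0, 130.0, 210.0, 355.0])
lab   = np.array([12.0, 50.0, 95.0, 120.0, 230.0, 340.0])

# Simple least-squares fit: lab ~ slope * field + intercept
slope, intercept = np.polyfit(field, lab, 1)
r = np.corrcoef(field, lab)[0, 1]

# Translate a (hypothetical) lab-based action level into a field-screening value
lab_action_level = 100.0  # ug/L
field_criterion = (lab_action_level - intercept) / slope

print(f"lab = {slope:.2f} * field + {intercept:.1f}  (r = {r:.2f})")
print(f"Field-based decision criterion: ~{field_criterion:.0f} ug/L")
```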

Once again, this challenge can be addressed by demonstrating to regulators the effectiveness of collaborative data sets. This guidance and the ITRC Triad guidance document the usefulness of this approach and, along with the associated IBT, help foster a more informed regulatory community.

Differentiating Between Matrix Storage and DNAPL

In choosing an effective source remedy, it is necessary to differentiate between the residual DNAPL area and the aqueous phase contamination associated with matrix storage. Treatment of DNAPL in a transmissive zone is much different from treatment of aqueous phase contamination back-diffusing from low-permeability zones. If the objective is to treat residual DNAPL, the areal extent of the DNAPL must be understood, and a technology that will destroy, degrade, or mobilize the residual is needed. Where matrix storage contributes some or all of the contamination through back-diffusion, other technologies are required to treat the aqueous phase contamination in the transmissive zone or the stored contamination in the low-permeability zone. The Tool Selection Worksheet offers information on tools that can collect geology, hydrogeology, and chemistry data to make these distinctions; Appendix G provides case examples that can help site managers determine whether their site is dominated by matrix diffusion. Providing the regulators with information and case studies describing the advantages and limitations of the tools that provide the data needed to differentiate between matrix storage of contaminants and DNAPL will help in gaining their acceptance.

Regulatory Benefits of Integrated DNAPL Site Characterization

As much of a challenge as they may pose, there are great benefits to accepting advanced characterization methods and gaining familiarity with collaborative data sets. Current groundwater regulatory policies generally focus on (1) controlling contaminant sources and migration; and (2) protecting/restoring beneficial uses. Early remediation efforts for DNAPL sites demonstrated that pump and treat (groundwater extraction) for contaminant mass removal is inefficient, particularly for depleting contaminant sources where NAPL is present. Furthermore, conventional long-screened monitoring wells yield sample results that are flow-weighted averages (USEPA 2004) and thus miss important spatial variability, presenting regulators with an inaccurate CSM.
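The sketch below, using hypothetical interval thicknesses, hydraulic conductivities, and concentrations, shows why a long-screened well tends to report a flow-weighted (transmissivity-weighted) average: a thin low-permeability zone storing high contaminant concentrations contributes little water to the sample and is largely masked by cleaner, more transmissive intervals.

```python
# Minimal sketch (hypothetical values): the sampled concentration from a long
# screen is approximately the transmissivity-weighted mean of the intervals the
# screen intersects, so a thin, contaminated low-permeability zone can be
# diluted by cleaner water from more transmissive intervals.

# (interval thickness [m], hydraulic conductivity [m/day], concentration [ug/L])
intervals = [
    (3.0, 5.0,   10.0),   # clean, transmissive sand
    (0.5, 0.2, 5000.0),   # thin low-permeability zone storing contaminant mass
    (4.0, 8.0,   15.0),   # clean, transmissive sand
]

weights = [b * k for b, k, _ in intervals]  # interval transmissivities
flow_weighted_c = sum(w * c for w, (_, _, c) in zip(weights, intervals)) / sum(weights)

print(f"Flow-weighted sample concentration: ~{flow_weighted_c:.0f} ug/L")
print(f"Maximum interval concentration:      {max(c for _, _, c in intervals):.0f} ug/L")
```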

The benefits of ISC are described in Section 1.3. For regulators, the most valuable is the environmental benefit of better-performing remedies. Other important benefits to regulators are as follows:

Improved characterization methods, with representative resolution, can clarify non-uniform source distribution, subsurface heterogeneities, and geochemical variations. This should produce a more refined CSM with a more efficient allocation of resources, resulting in greater accuracy in subsurface characterization. These methods will improve remedial design and monitoring and result in a shorter remedial time frame and reduced life cycle costs.

As illustrated in Figure 5-1, a more accurate picture of the subsurface, developed using an ISC approach, gives regulators a higher level of confidence upon which to make remedy decisions.

Figure 5-1. Benefits of integrated DNAPL site characterization.

USEPA Priority Actions in Climate Adaptation

The USEPA has released a draft Climate Change Adaptation Implementation Plan (USEPA 2014b). As the USEPA moves forward in identifying priority actions to address potential vulnerabilities to climate change, some of the action areas being discussed include (1) increasing engineering controls for contaminant migration at sites where a remedy is constructed; and (2) evaluating remedy effectiveness. These two areas are highly vulnerable to a greater incidence of flooding, hurricanes, drought, wildfires, or other consequences of climate change. The use of ISC may provide states and tribes with a better understanding of site-specific vulnerabilities, such as the effects of drought on fractured subsurface clays or the effects of prolonged flooding on in situ treatment technologies.

Public Education and Outreach

There is an inherent challenge in explaining the data-dense outputs of the new characterization methods. Traditional characterization approaches, relying on relatively few soil, sediment, and groundwater samples, were often presented graphically with extensive interpolation between data points. In any outreach setting, higher resolution and attention to the development of a scientifically based CSM allow all interested parties to discuss the future of a site based on science; contaminant transport; and long-term protection of drinking water resources, human health, surface water, and the environment.

This challenge can be met by the new ISC approach, which relies on measurements that present an accurate, scientifically based characterization of the contaminants’ current effect on the environment as well as insight into how they may behave in the future. Developing remedial goals based on reliable characterization and science-based remedial decision making will generate acceptable outcomes for all interested parties; however, the process must remain protective of resources, human health, and the environment and must adhere to the regulatory principles of state and federal programs.

Presentation of the data to the public can be challenging; however, because subsurface cross sections are now developed from frequent data collection points that properly represent preferential flow paths, the public can have greater confidence in the contaminant representations. Where conventional tools and illustrations fell short, newer three-dimensional representations and fence diagrams that accurately reflect both the density of data and all data collection points present a much more precise CSM. The accuracy and visual quality of the data presentation allow interested parties to see clearly what is happening at the site and to trust the decision making and remedial approach.

Presentation techniques should include the real-time data collected in the field, the science supporting the technology, and the defensible, science-based conclusions that can be drawn from the data. Statistical representation of the confidence associated with analytical quality and data density should be a part of any presentation that supports a CSM. Explanations of each technology, the data it collects, and how it contributes to the CSM should be presented in a format that is easily understood by the public. Public acceptance and understanding of the difficulties at the site will be enhanced if the data, CSM, and interpretation are presented transparently and in a format that sequentially lays out the decision-making process.